Web Survey Bibliography
This paper reports two studies investigating the impact of online survey sponsorship on dropout rates. In Study 1, 498 participants were randomised to one of four 78-item online surveys. All four surveys were identical in content but differed in presentation format. The first pair was hosted on our faculty web server using LimeSurvey and preceded by an information page on our school website; our university logo featured prominently on every page, representing a “high” level of university sponsorship. The remaining pair was hosted entirely on SurveyMonkey.com and made minimal reference to our university, representing a “low” level of university sponsorship. One version of each pair “forced” participants to answer every question on a page before continuing, whereas the other left all questions “optional”. Overall, 13.9% of participants commenced but did not complete the surveys. The proportion of participants completing the high sponsorship surveys did not differ from the proportion completing the low sponsorship surveys. Among those who completed the optional format surveys, participants in the low sponsorship condition answered significantly more items than those in the high sponsorship condition; there was no such difference between participants in the high and low conditions who did not complete the optional format surveys. However, LimeSurvey and SurveyMonkey differ in basic page formatting, load speeds and several other factors, any of which could be responsible for these findings.
These confounds were addressed in Study 2, in which 159 participants were randomised to one of two 65-item online surveys. Both were identical in content, used an optional response format, and were hosted on Qualtrics.com. The first survey represented a high level of university sponsorship: it was preceded by an information page on our school website, and the university name and logo featured prominently on every page and in the survey URL. The second survey lacked these characteristics and represented a low level of university sponsorship. Overall, 23.9% of participants commenced but did not complete the surveys. The proportion of participants who completed the high sponsorship survey did not differ from the proportion who completed the low sponsorship survey, and the numbers of items answered by participants who completed the two surveys were equivalent. Overall, although it is disappointing that dropout rates cannot be reduced simply by enhancing academic survey sponsor visibility, researchers without ready access to university web servers or logos will appreciate these findings.
Web survey bibliography
- Media tracker; 2012
- Measuring the quality of governmental websites in a controlled versus an online setting with the ‘...; 2012; Elling, S., Lentz, L., de Jong, M., van den Bergh, H.
- Measuring modern media consumption; 2012; Arini, N.
- ISO 20252. Market, opinion and social research - Vocabulary and service requirements, 2nd Edition; 2012
- Is “chapterisation” a viable alternative to traditional progress indicators?; 2012; Spicer, R., Dowling, Z.
- Internet use in households and by individuals in 2012. Eurostat Statistics in Focus 50/2012; 2012; Seybert, H.
- Internet access - Households and individuals, 2012 part 2; 2012
- Internet access - Households and individuals, 2012; 2012
- Google et Médiamétrie créent une audience bimédia [Google and Médiamétrie create a bimedia audience measure]; 2012; Gonzales, P.
- GMI Pinnacle; 2012
- Global market research 2012; 2012
- Explaining rising nonresponse rates in cross-sectional surveys; 2012; Brick, J. M., Williams, Do.
- Eurobarometer Special surveys: Special Eurobarometer 381; 2012
- Online Surveys 2.0; 2012; Elferink, R.
- The Impact of Academic Sponsorship on Online Survey Dropout Rates; 2012; Allen, P. J., Roberts, L. D.
- Especially for You: Motivating Respondents in an Internet Panel by Offering Tailored Questions; 2012; Oudejans, M.
- Social media as a data collection tool: the impact of Facebook in behavioural research; 2012; Zoppos, E.
- Smartphone Apps and User Engagement: Collecting Data in the Digital Era; 2012; Link, M. W.
- Snowball Sampling in Online Social Networks; 2012; Raissi, M., Ackland, R.
- The Use of Facebook as a Locating and Contacting Tool; 2012; McCarthy, T.
- How Often Do You Use the App with a Bird on It? Exploring Differences in Survey Completion Times, Primacy...; 2012; Buskirk, T. D.
- Data quality of questions sensitive to social-desirability bias in web surveys; 2012; Lozar Manfreda, K., Zajc, N., Berzelak, N., Vehovar, V.
- Online Questionnaires: Development of ‘basic requirements’; 2012; Tries, S., Blanke, K.
- Social research in online context: methodological reflections on web surveys from a case study; 2012; Pandolfini, V.
- Efficacy of a health-related Facebook social network site on health-seeking behaviors; 2012; Woolley, P., Peterson, M.
- The war against unengaged online respondents; 2012; Gittelman, S. H., Trimarchi, E.
- Qualitatively Speaking: The five absolute, no-excuse must-dos for online qualitative researchers; 2012; Rossow, A.
- By the Numbers: Lessons for using online panels in B2B research; 2012; Elsner, N.
- Specialized Tools for Measuring Past Events; 2012; Belli, R. F.
- Transparency, Access and the Credibility of Survey Research; 2012; Lupia, A.
- Can Microtargeting Improve Survey Sampling? An Assessment of Accuracy and Bias in Consumer File Marketing...; 2012; Pasek, J.
- Anonymity and Confidentiality; 2012; Tourangeau, R.
- Cognitive Evaluation of Survey Instruments: State of the Science (Art?) and Future Directions; 2012; Willis, G. B.
- Oh, Just One More Thing … Leveraging “Leave-Behinds” in Data Collection; 2012; Link, M. W.
- Paradata; 2012; Kreuter, F.
- Computation of Survey Weights: Bridging Theory and Practice; 2012; DeBell, M.
- Optimizing Response Rates; 2012; Brick, J. M.
- Modes of Data Collection; 2012; Tourangeau, R.
- The Use and Effects of Incentives in Surveys; 2012; Singer, E.
- Improving Question Design to Maximize Reliability and Validity; 2012; Krosnick, J. A.
- Respondent Attrition vs Data Attrition and Their Reduction; 2012; Olsen, R. J.
- Survey Interviewing: Deviations from the Script; 2012; Schaeffer, N. C.
- How accurate are surveys of objective phenomena?; 2012; Chang, L. C., Krosnick, J. A.
- Measure the response burden in the Swedish Intrastat system; 2012; Weideskog, F.
- Mode and non-response effects and their treatment; 2012; Chrysanthopoulos, S., Georgostathi, A.
- What can be said about quality in the Central Population Register based on a self-completion survey...; 2012; Falnes-Dalheim, E., Pedersen, H. E.
- Improving the quality of complex surveys: The case of the EU Labour Force Survey; 2012; van der Valk, J.
- Pros and cons of Internet based User Satisfaction Surveys; 2012; Consoli, A., Matsulevits, L.
- Between demand and reality: Ensuring efficiency and quality in pretesting questionnaires; 2012; Sattelberger, S., Blanke, K.
- How to provide high data quality in online-questionnaires: Setting guidelines in design; 2012; Tries, S., Nebel, S., Blanke, K.